118 research outputs found

    Continuous Transmission of Spatially Coupled LDPC Code Chains

    We propose a novel encoding/transmission scheme, called continuous chain (CC) transmission, that improves the finite-length performance of a system using spatially coupled low-density parity-check (SC-LDPC) codes. In CC transmission, instead of transmitting a sequence of independent codewords from a terminated SC-LDPC code chain, we connect multiple chains in a layered format, where encoding, transmission, and decoding are performed in a continuous fashion. The connections between chains are created at specific points, chosen to improve the finite-length performance of the code structure under iterative decoding. We describe the design of CC schemes for different SC-LDPC code ensembles constructed from protographs: a (J,K)-regular SC-LDPC code chain, a spatially coupled repeat-accumulate (SC-RA) code, and a spatially coupled accumulate-repeat-jagged-accumulate (SC-ARJA) code. In all cases, significant performance improvements are reported, and it is shown that CC transmission requires only a small increase in decoding complexity and decoding delay with respect to a system employing a single SC-LDPC code chain for transmission. This material is based upon work supported in part by the National Science Foundation under Grant Nos. CCF-1161754 and CCSS-1710920, in part by NSERC Canada, and in part by the Spanish Ministry of Economy and Competitiveness and the Spanish National Research Agency under Grants TEC2016-78434-C3-3-R (AEI/FEDER, EU) and Juan de la Cierva Fellowship IJCI-2014-19150.
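The chain structure underlying SC-LDPC codes can be illustrated with a small sketch of the standard edge-spreading construction, which builds the base matrix of a terminated chain from a protograph. This is a minimal illustration under assumed parameters (a (3,6)-regular protograph split into three component matrices, coupled over L = 10 positions); it does not reproduce the paper's specific chain-connection scheme.

```python
import numpy as np

def sc_base_matrix(L, components):
    """Build the base matrix of a terminated SC-LDPC chain by edge
    spreading: component matrices B_0..B_{w-1} are placed along a
    diagonal band over L coupled positions."""
    w = len(components)
    b_rows, b_cols = components[0].shape
    B = np.zeros(((L + w - 1) * b_rows, L * b_cols), dtype=int)
    for pos in range(L):
        for i, Bi in enumerate(components):
            r = (pos + i) * b_rows
            c = pos * b_cols
            B[r:r + b_rows, c:c + b_cols] += Bi
    return B

# (3,6)-regular protograph [[3, 3]] split into three component matrices
parts = [np.array([[1, 1]]) for _ in range(3)]
B = sc_base_matrix(L=10, components=parts)
print(B.shape)        # (12, 20): termination adds w - 1 = 2 extra check rows
print(B.sum(axis=0))  # every variable node keeps degree 3
```

The lower-degree check rows at the two ends of the band are what terminate the chain; they are also the reason terminated chains decode well from the boundaries inward.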

    Spatially coupled generalized LDPC codes: asymptotic analysis and finite length scaling

    Generalized low-density parity-check (GLDPC) codes are a class of LDPC codes in which the standard single parity check (SPC) constraints are replaced by constraints defined by a linear block code. These stronger constraints typically result in improved error floor performance, due to better minimum distance and trapping set properties, at the cost of some increased decoding complexity. In this paper, we study spatially coupled generalized low-density parity-check (SC-GLDPC) codes and present a comprehensive analysis of these codes, including: (1) an iterative decoding threshold analysis of SC-GLDPC code ensembles demonstrating capacity-approaching thresholds via the threshold saturation effect; (2) an asymptotic analysis of the minimum distance and free distance properties of SC-GLDPC code ensembles, demonstrating that the ensembles are asymptotically good; and (3) an analysis of the finite-length scaling behavior of both GLDPC block codes and SC-GLDPC codes based on a peeling decoder (PD) operating on a binary erasure channel (BEC). Results are compared to GLDPC block codes, and the advantages and disadvantages of SC-GLDPC codes are discussed. This work was supported in part by the National Science Foundation under Grant ECCS-1710920, Grant OIA-1757207, and Grant HRD-1914635; in part by the European Research Council (ERC) through the European Union's Horizon 2020 research and innovation program under Grant 714161; and in part by the Spanish Ministry of Science, Innovation and University under Grant TEC2016-78434-C3-3-R (AEI/FEDER, EU).
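The iterative decoding threshold analysis mentioned in item (1) can be sketched for the simplest baseline case: density evolution on the BEC for a (dv,dc)-regular LDPC ensemble with plain SPC constraints. This is a toy stand-in, not the GLDPC analysis itself (GLDPC density evolution replaces the check-node update with the component code's erasure-decoding behavior), but it shows how a BP threshold is computed by bisection.

```python
def bec_de_threshold(dv=3, dc=6, iters=5000, tol=1e-12):
    """Bisection on the BEC erasure rate eps for the density-evolution
    recursion x <- eps * (1 - (1 - x)^(dc-1))^(dv-1); returns (roughly)
    the largest eps for which the erasure probability x converges to 0."""
    def converges(eps):
        x = eps
        for _ in range(iters):
            x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
            if x < tol:
                return True
        return False

    lo, hi = 0.0, 1.0
    while hi - lo > 1e-6:
        mid = 0.5 * (lo + hi)
        if converges(mid):
            lo = mid
        else:
            hi = mid
    return lo

print(round(bec_de_threshold(), 4))  # ~0.4294 for the (3,6) ensemble
```

Spatial coupling raises this BP threshold toward the MAP threshold of the underlying ensemble (threshold saturation); the same bisection scaffold applies once the coupled recursion is substituted.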

    Randomly Punctured LDPC Codes

    In this paper, we present a random puncturing analysis of low-density parity-check (LDPC) code ensembles. We derive a simple analytic expression for the iterative belief propagation (BP) decoding threshold of a randomly punctured LDPC code ensemble on the binary erasure channel (BEC) and show that, with respect to the BP threshold, the strength and suitability of an LDPC code ensemble for random puncturing is completely determined by a single constant that depends only on the rate and the BP threshold of the mother code ensemble. We then provide an efficient way to accurately predict BP thresholds of randomly punctured LDPC code ensembles on the binary-input additive white Gaussian noise channel (BI-AWGNC), given only the BP threshold of the mother code ensemble on the BEC and the design rate, and we show how the prediction can be improved with knowledge of the BI-AWGNC threshold. We also perform an asymptotic minimum distance analysis of randomly punctured code ensembles and present simulation results that confirm the robust decoding performance promised by the asymptotic results. Protograph-based LDPC block code and spatially coupled LDPC code ensembles are used throughout as examples to demonstrate the results.
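The BEC part of this analysis can be sketched from first principles: a punctured bit is never transmitted, so it reaches the decoder as an erasure, and the mother decoder effectively sees erasure rate p + (1 - p)·eps when a fraction p of bits is punctured. Setting that equal to the mother threshold eps* gives the punctured threshold below. The rate-1/2, eps* = 0.4294 mother ensemble is a hypothetical example value, not taken from the paper's tables.

```python
def punctured_bec_threshold(eps_star, p):
    """BP threshold on the BEC after randomly puncturing a fraction p,
    assuming punctured bits arrive at the decoder as erasures:
    p + (1 - p) * eps = eps_star  =>  eps = 1 - (1 - eps_star)/(1 - p)."""
    return 1.0 - (1.0 - eps_star) / (1.0 - p)

def punctured_rate(R, p):
    """Design rate after puncturing a fraction p of code bits."""
    return R / (1.0 - p)

# Hypothetical mother ensemble: rate 1/2, BEC threshold 0.4294
R, eps_star = 0.5, 0.4294
for p in (0.0, 0.1, 0.2):
    Rp = punctured_rate(R, p)
    th = punctured_bec_threshold(eps_star, p)
    gap = (1.0 - Rp) - th  # distance to the Shannon limit 1 - Rp
    print(f"p={p:.1f}  rate={Rp:.3f}  threshold={th:.4f}  gap={gap:.4f}")
```

Note that the gap to capacity works out to ((1 - eps_star) - R)/(1 - p), i.e. it is governed entirely by the rate and BEC threshold of the mother ensemble, which is consistent with the single-constant characterization described above.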

    Randomly Punctured Spatially Coupled LDPC Codes

    In this paper, we study random puncturing of protograph-based spatially coupled low-density parity-check (SC-LDPC) code ensembles. We show that, with respect to the iterative decoding threshold, the strength and suitability of an LDPC code ensemble for random puncturing over the binary erasure channel (BEC) is completely determined by a single constant that depends only on the rate and iterative decoding threshold of the mother code ensemble. We then use this analysis to show that randomly punctured SC-LDPC code ensembles display near-capacity thresholds for a wide range of rates. We also perform an asymptotic minimum distance analysis and show that, like the SC-LDPC mother code ensemble, the punctured SC-LDPC code ensembles are also asymptotically good. Finally, we present some simulation results that confirm the excellent decoding performance promised by the asymptotic results.

    Analysis of Sample Correlations for Monte Carlo Rendering

    Modern physically based rendering techniques critically depend on approximating integrals of high-dimensional functions representing radiant light energy. Monte Carlo based integrators are the method of choice for complex scenes and effects. These integrators work by sampling the integrand at sample point locations. The distribution of these sample points determines convergence rates and noise in the final renderings. The characteristics of such distributions can be uniquely represented in terms of correlations of sampling point locations. Hence, it is essential to study these correlations to understand and adapt sample distributions for low error in integral approximation. In this work, we aim to provide a comprehensive and accessible overview of the techniques developed over the last decades to analyze such correlations, relate them to error in integrators, and understand when and how to use existing sampling algorithms for effective rendering workflows.
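The link between sample-point correlations and integration error can be demonstrated with a minimal one-dimensional experiment: estimating the same integral with uncorrelated (pure random) samples versus jittered (stratified) samples, which differ only in their point correlations. The integrand f(x) = x² is an arbitrary stand-in, not an example from the survey.

```python
import random

def mc_estimate(samples):
    """Monte Carlo estimate of the integral of f(x) = x^2 on [0, 1]
    (true value 1/3) from a list of sample locations."""
    return sum(x * x for x in samples) / len(samples)

def random_samples(n, rng):
    """Uncorrelated uniform samples."""
    return [rng.random() for _ in range(n)]

def jittered_samples(n, rng):
    """One sample per stratum: same density, different correlations."""
    return [(i + rng.random()) / n for i in range(n)]

rng = random.Random(0)
n, runs = 64, 200

def rms_error(gen):
    return (sum((mc_estimate(gen(n, rng)) - 1 / 3) ** 2
                for _ in range(runs)) / runs) ** 0.5

print(f"rms error, pure random: {rms_error(random_samples):.5f}")
print(f"rms error, jittered:    {rms_error(jittered_samples):.5f}")
```

With the same sample count, the jittered pattern yields a markedly lower RMS error on this smooth integrand, the basic effect the correlation analyses surveyed above make precise (e.g. via variance convergence rates).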

    Developing a predictive modelling capacity for a climate change-vulnerable blanket bog habitat: Assessing 1961-1990 baseline relationships

    Aim: Understanding the spatial distribution of high-priority habitats and developing predictive models using climate and environmental variables to replicate these distributions are desirable conservation goals. The aim of this study was to model and elucidate the contributions of climate and topography to the distribution of a priority blanket bog habitat in Ireland, and to examine how this might inform the development of a climate change predictive capacity for peatlands in Ireland. Methods: Ten climatic and two topographic variables were recorded for grid cells with a spatial resolution of 10 × 10 km, covering 87% of the mainland land surface of Ireland. Presence-absence data were matched to these variables and generalised linear models (GLMs) fitted to identify the main climatic and terrain predictor variables for occurrence of the habitat. Candidate predictor variables were screened for collinearity, and the accuracy of the final fitted GLM was evaluated using fourfold cross-validation based on the area under the curve (AUC) derived from a receiver operating characteristic (ROC) plot. The GLM-predicted habitat occurrence probability maps were mapped against the actual distributions using GIS techniques. Results: Despite the apparent parsimony of the initial GLM using only climatic variables, further testing indicated collinearity among, for example, temperature and precipitation variables. Subsequent elimination of the collinear variables and inclusion of elevation data produced an excellent performance based on the AUC scores of the final GLM. Mean annual temperature and total mean annual precipitation, in combination with elevation range, were the most powerful explanatory variable group among those explored for the presence of blanket bog habitat. Main conclusions: The results confirm that this habitat distribution can in general be modelled well using the non-collinear climatic and terrain variables tested at the grid resolution used. Mapping the GLM-predicted distribution to the observed distribution produced useful results in replicating the projected occurrence of the habitat distribution over an extensive area. The methods developed will usefully inform future climate change predictive modelling for Ireland.
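The modelling pipeline described (logistic-link GLM on presence-absence data, evaluated by fourfold cross-validated AUC) can be sketched end to end on synthetic data. The predictor names, coefficients, and data below are invented stand-ins for the study's grid-cell variables, and the hand-rolled gradient-ascent fit is a simplification of standard GLM estimation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins (hypothetical) for three grid-cell predictors:
# mean annual temperature, annual precipitation, elevation range.
n = 400
X = rng.normal(size=(n, 3))
true_logit = -1.5 * X[:, 0] + 1.0 * X[:, 1] + 1.2 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

def fit_glm(X, y, lr=0.1, steps=2000):
    """Logistic-link GLM fitted by gradient ascent on the log-likelihood."""
    Xb = np.column_stack([np.ones(len(X)), X])  # intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)
    return w

def predict(w, X):
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1 / (1 + np.exp(-Xb @ w))

def auc(y, scores):
    """AUC via the rank (Mann-Whitney) formulation of the ROC area."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n1 = y.sum()
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)

# Fourfold cross-validation, as in the study
folds = np.array_split(rng.permutation(n), 4)
scores = [auc(y[f], predict(fit_glm(np.delete(X, f, 0), np.delete(y, f)), X[f]))
          for f in folds]
print("fold AUCs:", [round(s, 3) for s in scores])
```

Collinearity screening (dropped here for brevity) would precede the fit, since correlated predictors inflate coefficient variance without improving AUC.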

    International Lower Limb Collaborative (INTELLECT) study : a multicentre, international retrospective audit of lower extremity open fractures


    Performance of reconstruction and identification of τ leptons decaying to hadrons and ν_τ in pp collisions at √s = 13 TeV

    The algorithm developed by the CMS Collaboration to reconstruct and identify τ leptons produced in proton-proton collisions at √s = 7 and 8 TeV, via their decays to hadrons and a neutrino, has been significantly improved. The changes include a revised reconstruction of π⁰ candidates, and improvements in multivariate discriminants to separate τ leptons from jets and electrons. The algorithm is extended to reconstruct τ leptons in highly Lorentz-boosted pair production, and in the high-level trigger. The performance of the algorithm is studied using proton-proton collisions recorded during 2016 at √s = 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹. The performance is evaluated in terms of the efficiency for a genuine τ lepton to pass the identification criteria and of the probabilities for jets, electrons, and muons to be misidentified as τ leptons. The results are found to be very close to those expected from Monte Carlo simulation.

    An embedding technique to determine ττ backgrounds in proton-proton collision data


    Measurement of the azimuthal anisotropy of Υ(1S) and Υ(2S) mesons in PbPb collisions at √s_NN = 5.02 TeV

    The second-order Fourier coefficients (v₂) characterizing the azimuthal distributions of Υ(1S) and Υ(2S) mesons produced in PbPb collisions at √s_NN = 5.02 TeV are studied. The Υ mesons are reconstructed in their dimuon decay channel, as measured by the CMS detector. The collected data set corresponds to an integrated luminosity of 1.7 nb⁻¹. The scalar product method is used to extract the v₂ coefficients of the azimuthal distributions. Results are reported for the rapidity range |y| < 2.4, in the transverse momentum interval 0 < p_T < 50 GeV/c, and in three centrality ranges of 10–30%, 30–50%, and 50–90%. In contrast to the J/ψ mesons, the measured v₂ values for the Υ mesons are found to be consistent with zero.
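What a v₂ measurement means can be shown with a toy calculation: draw azimuthal angles from dN/dφ ∝ 1 + 2·v₂·cos(2φ) and recover v₂ as ⟨cos 2φ⟩. This assumes a perfectly known event plane at φ = 0 and is not the scalar product method used above, which additionally corrects for finite event-plane resolution via reference-detector correlations.

```python
import math
import random

def sample_phi(v2, rng):
    """Draw an azimuthal angle from dN/dphi ∝ 1 + 2*v2*cos(2*phi)
    by rejection sampling against the distribution's maximum."""
    while True:
        phi = rng.uniform(0.0, 2.0 * math.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2) <= 1.0 + 2.0 * v2 * math.cos(2.0 * phi):
            return phi

rng = random.Random(7)
true_v2 = 0.10
phis = [sample_phi(true_v2, rng) for _ in range(100_000)]

# With the event plane fixed at 0, v2 reduces to the mean of cos(2*phi)
v2_est = sum(math.cos(2.0 * p) for p in phis) / len(phis)
print(f"estimated v2 = {v2_est:.4f}")
```

A result "consistent with zero", as reported for the Υ mesons, means this mean cosine is compatible with 0 within its statistical and systematic uncertainties.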
